
    Biodiversity in Viticulture – What Is Possible in Viticulture?

    The number of insects in Germany has declined markedly over recent decades (BfN 2017). The Red List of wild bees, for example, shows that 41 % of the roughly 560 wild bee species must now be classified as threatened (Westrich et al. 2011). The decline of insect populations and the loss of insect species have led to a clearly visible reduction of insect biomass in the agricultural landscape of up to 80 % (Sorg 2013; Schwenninger & Scheuchl 2016). In 2014, the Bayerische Landesanstalt fĂŒr Weinbau und Gartenbau (Bavarian State Institute for Viticulture and Horticulture) started a project on the ThĂŒngersheimer Scharlachberg with the aim of developing a vineyard site with the highest possible biodiversity. In a planned follow-up research project, comparative studies with other vineyard sites in Franconia are then intended to reveal the differences in biodiversity.

    Challenges for Monocular 6D Object Pose Estimation in Robotics

    Object pose estimation is a core perception task that enables, for example, object grasping and scene understanding. The widely available, inexpensive and high-resolution RGB sensors and CNNs that allow for fast inference based on this modality make monocular approaches especially well suited for robotics applications. We observe that previous surveys on object pose estimation establish the state of the art for varying modalities, single- and multi-view settings, and datasets and metrics that consider a multitude of applications. We argue, however, that those works' broad scope hinders the identification of open challenges that are specific to monocular approaches and the derivation of promising future directions for their application in robotics. By providing a unified view on recent publications from both robotics and computer vision, we find that occlusion handling, novel pose representations, and formalizing and improving category-level pose estimation remain fundamental challenges that are highly relevant for robotics. Moreover, to further improve robotic performance, large object sets, novel objects, refractive materials, and uncertainty estimates are central, largely unsolved open challenges. Addressing them will require improvements in ontological reasoning, deformability handling, scene-level reasoning, and realistic datasets, as well as a reduced ecological footprint of algorithms.
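    As a concrete illustration of the monocular setting described above, the sketch below recovers a 6D pose (3 degrees of freedom for rotation plus 3 for translation) from a single RGB view using 2D-3D correspondences and OpenCV's PnP solver. It is a generic example rather than a method from the surveyed literature: the model points, camera intrinsics, and simulated detections are hypothetical, and in a real pipeline the 2D keypoints would be predicted by a CNN from the image.

    # Hedged sketch: monocular 6D pose from 2D-3D correspondences via PnP.
    import numpy as np
    import cv2

    # Hypothetical 3D points on the object model (metres, object frame).
    object_points = np.array([[0.0, 0.0, 0.0],
                              [0.1, 0.0, 0.0],
                              [0.0, 0.1, 0.0],
                              [0.0, 0.0, 0.1],
                              [0.1, 0.1, 0.0],
                              [0.1, 0.0, 0.1]], dtype=np.float64)

    # Assumed pinhole intrinsics (fx = fy = 600 px, principal point 320, 240),
    # with no lens distortion.
    K = np.array([[600.0, 0.0, 320.0],
                  [0.0, 600.0, 240.0],
                  [0.0, 0.0, 1.0]], dtype=np.float64)
    dist = np.zeros(5)

    # Simulate the 2D detections by projecting the model with a known pose; in
    # practice a CNN keypoint head would predict these from the RGB image.
    rvec_true = np.array([[0.1], [0.2], [0.3]])     # axis-angle rotation
    tvec_true = np.array([[0.05], [-0.02], [0.6]])  # translation in metres
    image_points, _ = cv2.projectPoints(object_points, rvec_true, tvec_true, K, dist)

    # Recover the full 6D pose from the 2D-3D matches.
    ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
    if ok:
        R, _ = cv2.Rodrigues(rvec)                  # 3x3 rotation matrix
        print("Rotation:\n", R)
        print("Translation (m):", tvec.ravel())

    The recovered rvec and tvec match rvec_true and tvec_true up to numerical precision; this rotation-plus-translation pair is the quantity a grasping or scene-understanding pipeline would consume.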

    Investigation of commercial graphenes

    For graphene to achieve its full scientific and commercial potential, reliable mass production of the material on the multi-tonne scale is essential. We have investigated five samples of graphene obtained from commercial sources that state they can supply the product on the tonne scale per annum. Using electron microscopy from the micrometre to the nanometre scale, together with neutron vibrational spectroscopy, we find that none of the materials examined consisted of 100 % isolated graphene sheets. In all cases, there was a substantial content of graphite-like material. The samples exhibited varying oxygen contents; the oxygen could be present as carboxylic acid groups (although other oxygenates, such as quinones and phenols, may also be present) or as water. We emphasise that inelastic neutron scattering (INS) spectroscopy is particularly useful for the investigation of inorganic materials that will be used commercially: it provides atomic-scale information from macroscopic (tens of grams) amounts of sample, thus ensuring that the results are truly representative.

    Worst-Case Energy Consumption Analysis for Energy-Constrained Embedded Systems

    The fact that energy is a scarce resource in many embedded real-time systems creates the need for energy-aware task schedulers, which not only guarantee timing constraints but also consider energy consumption. Unfortunately, existing approaches to analyze the worst-case execution time (WCET) of a task usually cannot be directly applied to determine its worst-case energy consumption (WCEC), because execution time and energy consumption are not closely correlated on many state-of-the-art processors. Instead, a WCEC analyzer must take into account the particular energy characteristics of a target platform. In this paper, we present 0g, a comprehensive approach to WCEC analysis that combines different techniques to speed up the analysis and to improve results. If detailed knowledge about the energy costs of instructions on the target platform is available, our tool is able to compute upper bounds for the WCEC by statically analyzing the program code. Otherwise, a novel approach allows 0g to determine the WCEC by measurement after having identified a set of suitable program inputs based on an auxiliary energy model, which specifies the energy consumption of instructions in relation to each other. Our experiments for three target platforms show that 0g provides precise WCEC estimates.
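    To make the static-analysis idea concrete, below is a minimal sketch, not the 0g tool itself, that bounds the WCEC of a loop-free program by a longest-path search over its control-flow graph, assuming a per-instruction energy model for the target platform is available. The instruction costs, block contents, and graph structure are invented for illustration; a real analyzer would additionally have to bound loop iterations and model context-dependent energy costs.

    # Hedged sketch of a static WCEC bound: longest energy path through an
    # acyclic control-flow graph, given per-instruction energy costs.
    from functools import lru_cache

    # Hypothetical per-instruction energy costs in nanojoules (platform-specific).
    ENERGY_NJ = {"add": 1.2, "mul": 3.5, "load": 6.0, "store": 5.5, "branch": 2.0}

    # Acyclic control-flow graph: basic block -> list of successor blocks.
    CFG = {
        "entry": ["check"],
        "check": ["fast_path", "slow_path"],
        "fast_path": ["exit"],
        "slow_path": ["exit"],
        "exit": [],
    }

    # Instructions executed in each basic block.
    BLOCKS = {
        "entry": ["load", "add"],
        "check": ["load", "branch"],
        "fast_path": ["add", "store"],
        "slow_path": ["mul", "mul", "store"],
        "exit": ["branch"],
    }

    def block_energy(block: str) -> float:
        """Sum the modelled energy of all instructions in one basic block."""
        return sum(ENERGY_NJ[i] for i in BLOCKS[block])

    @lru_cache(maxsize=None)
    def wcec_bound(block: str) -> float:
        """Upper bound on the energy consumed from `block` to program exit."""
        worst_rest = max((wcec_bound(s) for s in CFG[block]), default=0.0)
        return block_energy(block) + worst_rest

    if __name__ == "__main__":
        print(f"WCEC upper bound: {wcec_bound('entry'):.1f} nJ")

    With these illustrative numbers the worst case runs through slow_path, and the script reports a bound of 29.7 nJ.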

    Treatment of Early Breast Cancer Patients: Evidence, Controversies, Consensus: Focusing on Systemic Therapy - German Experts' Opinions for the 16th International St. Gallen Consensus Conference (Vienna 2019)

    A German working group of leading breast cancer experts has discussed the votes at the International St. Gallen Consensus Conference in Vienna for the treatment of primary breast cancer with regard to the German AGO (Arbeitsgemeinschaft GynĂ€kologische Onkologie) recommendations for clinical practice in Germany. Three of the German breast cancer experts were also members of this year's St. Gallen panel. Comparing the St. Gallen recommendations with the annually updated treatment recommendations of the Gynecological Oncology Working Group (AGO Mamma 2019) and the German S3 Guideline is useful, because the recommendations of the St. Gallen panel are based on expert opinions from different countries and disciplines. The focus of this article is on systemic therapy. The motto of this year's 16th St. Gallen Consensus Conference was "Estimating the magnitude of clinical benefit". The rationale behind this motto is that, for every treatment decision, a benefit-risk assessment must be made for each individual patient.

    The GRAVITY+ Project: Towards All-sky, Faint-Science, High-Contrast Near-Infrared Interferometry at the VLTI

    The GRAVITY instrument has been revolutionary for near-infrared interferometry by pushing sensitivity and precision to previously unknown limits. With the upgrade of GRAVITY and the Very Large Telescope Interferometer (VLTI) in GRAVITY+, these limits will be pushed even further, with vastly improved sky coverage, as well as faint-science and high-contrast capabilities. This upgrade includes the implementation of wide-field off-axis fringe-tracking, new adaptive optics systems on all Unit Telescopes, and laser guide stars in an upgraded facility. GRAVITY+ will open up the sky to the measurement of black hole masses across cosmic time in hundreds of active galactic nuclei, use the faint stars in the Galactic centre to probe General Relativity, and enable the characterisation of dozens of young exoplanets to study their formation, bearing the promise of another scientific revolution to come at the VLTI.

    The ABC130 barrel module prototyping programme for the ATLAS strip tracker

    For the Phase-II Upgrade of the ATLAS Detector, its Inner Detector, consisting of silicon pixel, silicon strip and transition radiation sub-detectors, will be replaced with an all-new, all-silicon tracker, composed of a pixel tracker at inner radii and a strip tracker at outer radii. The future ATLAS strip tracker will include 11,000 silicon sensor modules in the central region (barrel) and 7,000 modules in the forward region (end-caps), which are foreseen to be constructed over a period of 3.5 years. The construction of each module consists of a series of assembly and quality control steps, which were engineered to be identical for all production sites. In order to develop the tooling and procedures for assembly and testing of these modules, two major prototyping programmes were conducted: an early programme using readout chips designed in a 250 nm fabrication process (ABCN-25) and a subsequent programme using a follow-up chip set made in a 130 nm process (ABC130 and HCC130 chips). This second generation of readout chips was used for an extensive prototyping programme that produced around 100 barrel-type modules and contributed significantly to the development of the final module layout. This paper gives an overview of the components used in ABC130 barrel modules, their assembly procedure, and findings resulting from their tests.